Towards “Cranfield” Test Collections for Personal Data Search Evaluation

Authors

  • Liadh Kelly
  • Gareth J.F. Jones
Abstract

Desktop archives are distinct from the sources for which shared “Cranfield” information retrieval test collections have been created to date. Differences associated with desktop collections include: they are personal to the archive owner, the owner has personal memories about the items contained within them, and only the collection owner can rate the relevance of items retrieved in response to their query. In this paper we discuss these unique attributes of desktop collections and search, and the resulting challenges associated with creating test collections for desktop search. We also outline a proposed strategy for creating test collections for this space.


Similar articles

A Strategy for Evaluating Search of “Real” Personal Information Archives

Personal information archives (PIAs) can include materials from many sources, e.g. desktop and laptop computers, mobile phones, etc. Evaluation of personal search over these collections is problematic for reasons relating to the personal and private nature of the data and associated information needs and measuring system response effectiveness. Conventional information retrieval (IR) evaluation...


ECIR WORKSHOP REPORT Workshop on Evaluating Personal Search

The first ECIR workshop on Evaluating Personal Search was held on 18 April 2011 in Dublin, Ireland. The workshop consisted of 6 oral paper presentations and several discussion sessions. This report presents an overview of the scope and contents of the workshop and outlines the major outcomes. 1 Introduction Personal Search (PS) refers to the process of searching within one’s personal space of...


A Laboratory-Based Method for the Evaluation of Personalised Search

Comparative evaluation of Information Retrieval Systems (IRSs) using publicly available test collections has become an established practice in Information Retrieval (IR). By means of the popular Cranfield evaluation paradigm, IR test collections enable researchers to compare new methods to existing approaches. An important area of IR research where this strategy has not been applied to date is...


The Philosophy of Information Retrieval Evaluation

Evaluation conferences such as TREC, CLEF, and NTCIR are modern examples of the Cranfield evaluation paradigm. In Cranfield, researchers perform experiments on test collections to compare the relative effectiveness of different retrieval approaches. The test collections allow the researchers to control the effects of different system parameters, increasing the power and decreasing the cost of r...


Philosophy of IR Evaluation

  • System evaluation: how good are document rankings?
  • User-based evaluation: how satisfied is the user?

Why do system evaluation?

  • Allows sufficient control of variables to increase the power of comparative experiments
    – laboratory tests are less expensive
    – laboratory tests are more diagnostic
    – laboratory tests are necessarily an abstraction
  • It works!
    – numerous examples of techniques developed in the l...


Journal title:

Volume   Issue

Pages  -

Publication date: 2011